ABSTRACT
Introduction: Short response times are critical for future military medical operations in austere or remote settings. Effective patient care at the point of injury can benefit greatly from the integration of semi-autonomous robotic systems. To achieve autonomy, robots require large libraries of maneuvers collected for the purpose of training machine learning algorithms. Although this is attainable in controlled settings, obtaining surgical data in austere settings can be difficult. Hence, in this article, we present the Dexterous Surgical Skill (DESK) database for knowledge transfer between robots. The peg transfer task was selected because it is one of the six main tasks of laparoscopic training. In addition, we provide a machine learning framework to evaluate novel transfer learning methodologies on this database.
Methods: A set of surgical gestures was collected for a peg transfer task, composed of seven atomic maneuvers referred to as surgemes. The DESK dataset comprises surgical robotic skills recorded on four robotic platforms: Taurus II, simulated Taurus II, YuMi, and the da Vinci Research Kit. We then explored two learning scenarios: no-transfer and domain-transfer. In the no-transfer scenario, the training and testing data were obtained from the same domain; in the domain-transfer scenario, the training data are a blend of simulated and real robot data, and the models are tested on a real robot.
Results: Using simulation data to train the learning algorithms enhances performance on the real robot when limited or no real data are available. The transfer model achieved an accuracy of 81% for the YuMi robot when the ratio of real-to-simulated data was 22% to 78%. For the Taurus II and the da Vinci, the model achieved accuracies of 97.5% and 93%, respectively, when trained only with simulation data.
Conclusions: The results indicate that simulation can be used to augment training data and thereby enhance the performance of learned models in real scenarios. This shows potential for the future use of surgical data from the operating room in deployable surgical robots in remote areas.
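The domain-transfer scenario described above can be sketched in a few lines: train a classifier on a blend of plentiful simulated data and scarce real data, then evaluate on held-out real data. The sketch below is a minimal illustration only; the synthetic Gaussian features, the 78%/22% sim-to-real blend, and the nearest-centroid classifier are all hypothetical stand-ins, not the paper's actual features or models.

```python
import numpy as np

rng = np.random.default_rng(0)


def make_domain(n_per_class, shift):
    """Hypothetical stand-in data: 2 surgeme classes in 8-D feature space.
    A constant offset emulates the shift between simulated and real domains."""
    X, y = [], []
    for label in (0, 1):
        centre = np.full(8, float(label)) + shift
        X.append(rng.normal(centre, 0.4, size=(n_per_class, 8)))
        y.append(np.full(n_per_class, label))
    return np.vstack(X), np.concatenate(y)


X_sim, y_sim = make_domain(78, shift=0.15)    # plentiful simulated data
X_real, y_real = make_domain(22, shift=0.0)   # scarce real data
X_test, y_test = make_domain(100, shift=0.0)  # held-out real data

# Blend exactly 78% simulated with 22% real, mirroring the YuMi experiment.
X_train = np.vstack([X_sim, X_real])
y_train = np.concatenate([y_sim, y_real])

# Nearest-centroid classifier: a deliberately simple placeholder model.
centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
pred = np.argmin(dists, axis=1)
accuracy = (pred == y_test).mean()
print(f"accuracy on real test data: {accuracy:.2%}")
```

The same loop can be rerun with different sim-to-real ratios to trace how accuracy on the real-robot test set varies with the amount of real data available, which is the experiment the Results section summarizes.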
-
Individuals who are blind adopt multiple procedures to tactually explore images. Automatically recognizing and classifying users' exploration behaviors is the first step towards developing an intelligent system that could help users explore images more efficiently. In this paper, a computational framework was developed to classify the different procedures used by blind users during image exploration. Translation-, rotation-, and scale-invariant features were extracted from the trajectories of users' movements. These features were divided into numerical and logical features and were fed into neural networks. More specifically, we trained spiking neural networks (SNNs) to further encode the numerical features as model strings. The proposed framework employed a distance-based classification scheme to determine the final class/label of the exploratory procedures. Dempster-Shafer Theory (DST) was applied to integrate the distances obtained from all the features. Across experiments with different spiking-neuron dynamics, the proposed framework achieved a classification accuracy of 95.89%. It is highly effective at encoding and classifying spatio-temporal data compared with Dynamic Time Warping and a Hidden Markov Model, which achieved 61.30% and 28.70% accuracy, respectively. The proposed framework serves as a fundamental building block for intelligent interfaces that enhance the image exploration experience for the blind.
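The fusion step above relies on Dempster's rule of combination, which merges two independent belief-mass assignments and renormalizes by the mass lost to conflict. The sketch below is a generic textbook implementation of that rule, not the paper's code; the two mass functions and the procedure labels "P1"/"P2" are invented for illustration.

```python
from itertools import product


def dempster_combine(m1, m2):
    """Dempster's rule of combination for mass functions whose focal
    elements are frozensets: m(X) ∝ sum of m1(Y)·m2(Z) over Y ∩ Z = X,
    renormalized by 1 minus the total conflicting mass."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to disjoint hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict; Dempster's rule is undefined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}


# Hypothetical example: two feature sources (e.g. numerical vs logical
# distances) each assign belief mass over two exploration procedures.
P1, P2 = frozenset({"P1"}), frozenset({"P2"})
both = P1 | P2  # ignorance: mass not committed to either label
m_numeric = {P1: 0.6, P2: 0.3, both: 0.1}
m_logical = {P1: 0.5, P2: 0.2, both: 0.3}
fused = dempster_combine(m_numeric, m_logical)
label = max(fused, key=fused.get)
```

After combination the fused masses sum to one, and the singleton with the largest mass gives the final class/label; in this toy example both sources lean towards "P1", so the fused belief concentrates there.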